AI-Based Anomaly Detection for Smart Factories
MathWorks discusses how AI models can be trained to help improve the performance of industrial equipment.
October 3, 2024
Software vendors in the engineering space are incorporating artificial intelligence (AI) into design, simulation, and computer-aided manufacturing (CAM) software. AI also plays a role in digital twin solutions.
All of these platforms are used to design so-called “smart factory” solutions that include connected machines, digital twins, and virtual models of real-world facilities. What role will AI play in the creation of these highly connected factories? We recently spoke with Rachel Johnson, MathWorks Principal Product Manager, and Philipp Wallner, MathWorks Industry Manager, about the process of developing an AI-based anomaly detection system that can help enable smart factory solutions.
Why will AI play an important role in creating smart factories and automating anomaly detection?
Rachel Johnson: We typically see this proceed through four major steps. It always starts with data. We live in this data-heavy universe, and our engineers need to start at step one. That means gathering data and defining the problem they are trying to solve. This can take a significant amount of time, and it is often underestimated. Do I have the data to support the problem I am trying to solve? Is the problem scoped thoroughly enough?
Step two is processing and exploring data, organizing it, labeling it, and getting it into a clean state before you can train and validate an AI model. If the accuracy of the trained model is not high enough for your application, you go back and ask, “Do I need to extract different features? Do I need to collect more data?” You go through this back-and-forth until you reach an accuracy level that works for your application. We have a Predictive Maintenance Toolbox that helps with this iterative process.
Then you feed that into interactive applications to train a bunch of models to see what works best. Once that is all complete and you are happy with the results, it is time to put this into operation. These AI models are not very valuable unless they are out there in the field, deployed directly onto an embedded device or maybe in a cloud environment so you can take action.
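To make that train-and-compare step concrete, here is a minimal sketch in generic Python rather than the MATLAB apps and Predictive Maintenance Toolbox described above: extract a few time-domain features from raw sensor windows, fit several candidate anomaly detectors on healthy data only, and score them on a small labeled validation set. The data, feature choices, and contamination settings are purely illustrative, not a recommended workflow.

```python
# Hedged sketch: compare several anomaly detectors on simple features
# extracted from synthetic "sensor" windows. All data is hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.svm import OneClassSVM
from sklearn.neighbors import LocalOutlierFactor
from sklearn.metrics import f1_score

def extract_features(windows):
    """Summarize each raw sensor window with a few time-domain statistics."""
    return np.column_stack([
        windows.mean(axis=1),                                    # mean level
        windows.std(axis=1),                                     # overall vibration energy
        np.abs(windows).max(axis=1),                             # peak amplitude
        ((windows[:, 1:] * windows[:, :-1]) < 0).mean(axis=1),   # zero-crossing rate
    ])

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(500, 256))       # training set: healthy windows only
val_ok = rng.normal(0.0, 1.0, size=(100, 256))
val_bad = rng.normal(0.0, 3.0, size=(20, 256))        # injected "faulty" windows for validation
X_train = extract_features(healthy)
X_val = extract_features(np.vstack([val_ok, val_bad]))
y_val = np.r_[np.zeros(len(val_ok)), np.ones(len(val_bad))]   # 1 = anomaly

candidates = {
    "isolation_forest": IsolationForest(contamination=0.05, random_state=0),
    "one_class_svm": OneClassSVM(nu=0.05, gamma="scale"),
    "local_outlier_factor": LocalOutlierFactor(novelty=True, contamination=0.05),
}
for name, model in candidates.items():
    model.fit(X_train)
    pred = (model.predict(X_val) == -1).astype(int)   # sklearn convention: -1 marks an outlier
    print(f"{name:22s} F1 = {f1_score(y_val, pred):.2f}")
```

In practice, the model that wins this comparison is the one that moves on to the deployment step described next.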
Philipp Wallner: A lot of people are talking about AI and AI models, but the real value is in the fourth step where you take your trained model and deploy it where you need it.
We are working with a German company, Mondi, that has a few hundred sensors on its equipment. They used MATLAB to develop anomaly detection functionality and then deployed it on edge devices using an industrial PC. Another one of our hardware partners, Beckhoff, is deploying AI functionality on a PLC and running it in real time, 24/7, for visual inspections. [You can learn more about that application in this video.]
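As a rough illustration of what the deployed side of such a system might look like, the sketch below scores incoming sensor windows against a trained detector in a loop and raises an alert on anomalies. It is generic Python with placeholder data sources and a stand-in model; in the deployments described above, the trained model would instead be generated for an industrial PC or PLC and run continuously.

```python
# Hedged sketch of an edge-side scoring loop. The detector, sensor reader,
# and alert hook are all placeholders, not a real deployment target.
import time
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(rng.normal(0.0, 1.0, size=(1000, 4)))   # stand-in for the trained model artifact

def read_sensor_window():
    """Placeholder for reading one feature vector from the fieldbus/DAQ."""
    return rng.normal(0.0, 1.0, size=(1, 4))

def raise_alert(score):
    """Placeholder for the plant's alarm or maintenance workflow."""
    print(f"Anomaly detected (score={score:.3f}); flag equipment for inspection")

for _ in range(500):                        # bounded loop for the sketch; real systems run continuously
    window = read_sensor_window()
    if detector.predict(window)[0] == -1:   # sklearn convention: -1 marks an outlier
        raise_alert(detector.decision_function(window)[0])
    time.sleep(0.01)                        # pacing placeholder; real controllers use a deterministic cycle
```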
How can AI work in tandem with engineers to reduce the incidence of defects and optimize maintenance schedules?
Johnson: We have another customer, IMCORP, whose engineers were detecting anomalies in underground power cables by examining signal patterns, a skill that took years to learn. What they ended up doing was designing an AI-based anomaly detection system to automatically identify anomalies in those signals. Previously, this was a fairly subjective process for the engineers. AI helped level the playing field across different levels of expertise and freed up time for analysts to focus on the real engineering work they wanted to do. This is a nice example of what we are seeing a lot of right now. Engineers can’t do everything manually anymore. You are going to need an engineer involved to ensure this is working correctly, but AI enables engineers to scale that expertise. It’s not meant to be a replacement for engineering knowledge; it’s about how you take that expertise and scale it.
Wallner: If you look at today's equipment, and the sheer number of sensors involved, a human cannot evaluate all of that data. With this type of automated anomaly detection, the knowledge gets implemented directly on a piece of equipment and AI does that job.
If AI can be used to detect anomalies in an operation, can it help automate responses to those issues or automate mitigation processes?
Johnson: This is something I have heard customers talk about, but most of the customers I have met with are not ready for an automated response. We talk about predictive maintenance and proactive maintenance. This is a step beyond just achieving detection accuracy. Once the system is in place, the response to detected anomalies is still largely manual. Once detection is happening reliably, then we can start looking at how to automate some of those responses.
What are the best practices for validating and verifying data accuracy?
Johnson: We see a lot of interest in synthetic data generation because customers don’t have enough examples of anomalous data to train an accurate model. The tie-in for us is that we work with a lot of customers doing model-based design. If engineers have models of their systems that they use in the design process, they can repurpose those models to generate physics-based data for model training. This assumes they have validated, physics-based models and operational data that can be used. Once the accuracy of the model itself has been validated, you can be fairly sure that the data that comes out of that model will be accurate. From an engineering perspective, the use case for synthetic data generation is physics-based models built in Simulink or Simscape.
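As a toy illustration of that idea, the sketch below uses a simple mass-spring-damper model in Python/SciPy instead of a validated Simulink or Simscape model: simulate the nominal system to produce healthy traces, then inject a parameter fault (degraded damping, an assumed fault mode) to produce the anomalous examples that are rare in real operational data. All parameter values are illustrative only.

```python
# Hedged sketch of synthetic data generation from a toy physics model.
import numpy as np
from scipy.integrate import solve_ivp

def simulate_response(damping, duration=5.0, fs=200):
    """Displacement response of a mass-spring-damper released from an initial offset."""
    m, k = 1.0, 40.0                                   # mass [kg], stiffness [N/m] (illustrative)
    t_eval = np.linspace(0.0, duration, int(duration * fs))
    def ode(t, y):
        x, v = y
        return [v, -(damping * v + k * x) / m]
    sol = solve_ivp(ode, (0.0, duration), [0.01, 0.0], t_eval=t_eval)
    return sol.y[0] + np.random.normal(0.0, 1e-4, size=t_eval.size)   # add sensor noise

# Healthy population: nominal damping with small unit-to-unit variation.
healthy = np.array([simulate_response(damping=2.0 + np.random.normal(0, 0.05))
                    for _ in range(200)])
# Synthetic fault population: damping degraded by an assumed wear mechanism.
faulty = np.array([simulate_response(damping=0.4 + np.random.normal(0, 0.05))
                   for _ in range(200)])
print(healthy.shape, faulty.shape)   # each row is one simulated sensor trace for training
```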
This is never a one-and-done scenario. Once an AI algorithm is trained and put into operation, you have to continuously monitor it and make sure that it is continuing to represent the system it is deployed on. Continuously monitoring and validating with new data and then deciding when to retrain the model is baked into these deployments.
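One simple way to picture that monitoring step, assuming the feature distribution seen at training time is retained, is to periodically compare it against recent field data with a two-sample statistical test and flag the model for retraining when the distributions diverge. The sketch below uses a Kolmogorov-Smirnov test in Python; the significance threshold and window sizes are illustrative, not a recommendation.

```python
# Hedged sketch of drift monitoring as a retraining trigger.
import numpy as np
from scipy.stats import ks_2samp

def drift_detected(train_features, recent_features, alpha=0.01):
    """Flag drift if any feature's distribution has shifted significantly."""
    for i in range(train_features.shape[1]):
        stat, p_value = ks_2samp(train_features[:, i], recent_features[:, i])
        if p_value < alpha:
            return True, i, p_value
    return False, None, None

rng = np.random.default_rng(1)
train_feats = rng.normal(0.0, 1.0, size=(1000, 4))    # distribution captured at training time
recent_feats = rng.normal(0.4, 1.2, size=(200, 4))    # simulated shift in operating conditions

drifted, feature_idx, p = drift_detected(train_feats, recent_feats)
if drifted:
    print(f"Feature {feature_idx} drifted (p={p:.1e}); schedule model retraining and validation.")
```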
How familiar are your customers with these types of practical AI applications?
Wallner: I mainly work with leading companies in manufacturing, and AI has been a big topic over the past several years. It has become more practical, and one big theme that comes up is the notion of smart factories.
We talk about software-defined manufacturing, and what we see is that in modern factories, more and more of the critical functionality is implemented in software running directly on embedded equipment, on factory floor edge devices, or in the cloud. In the past, when we talked about factory automation, we would analyze information from sensors mounted on the equipment individually or use offline analysis with tools like MATLAB.
Today there are hundreds of thousands of sensors mounted on equipment and millions of lines of software code. Those traditional methods of analysis are insufficient for finding anomalies. This is where AI methods are being used more and more to scale the impact that engineers can have with their domain expertise.
What do you see as the future state of how AI will impact smart factory deployments?
Wallner: We are at the very beginning. We do see equipment being mainly driven by software, and the role AI really plays is in having a huge potential for scale. You can take data from physical sensors and use it for anomaly detection and in a lot of other areas, like optimization and quality, as the complexity of smart factories increases. That is the interesting journey that is in front of us.
Johnson: I would add that I think we are seeing companies being more realistic, and realizing that AI is not a magic bullet. Engineers need to be involved, and that domain expertise is required as part of the process.
About the Author
Brian Albright is the editorial director of Digital Engineering. Contact him at de-editors@digitaleng.news.